Infectious Diseases of Poverty
Springer Science and Business Media LLC
Preprints posted in the last 30 days, ranked by how well they match Infectious Diseases of Poverty's content profile, based on 10 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Tegenaw, G. S.; Degu, M. Z.; Gebeyehu, W. B.; Senay, A. B.; Krishnamoorthy, J.; Ward, T.; Simegn, G. L.
Effective public health planning and intervention strategies require an understanding of the temporal and geographic distribution of disease incidence, which in turn calls for robust disease incidence forecasting frameworks. However, variations in case counts and temporal dynamics make the distinct patterns of climate-sensitive diseases difficult to grasp, including the identification of hotspots, trends, and seasonal variations in incidence. Furthermore, although most studies focus on directly predicting future incidence from historical patterns and covariates, a significant gap remains between methodological proliferation, in which diverse architectures are trained and validated on standardized, statistically stable benchmark datasets, and epidemiological reality, which is often characterized by irregular, sparse, and highly skewed data as well as rare but high-magnitude or bimodally distributed incidences. Traditional end-to-end approaches that directly map climate data to disease data therefore often fail in these data-scarce settings due to overfitting and poor generalization. To understand disease epidemiology and mitigate the impact of incidence, we analyzed a decade of retrospective data from Ethiopia to examine how climate and weather conditions influence the incidence and spread of climate-sensitive diseases, including malaria and dysentery. We propose a two-stage hybrid framework, a climate-informed disease prediction model, to forecast the likelihood of disease incidence from decades of climate and weather data. First, deep learning is applied to capture latent weather dynamics. Then, a hurdle model based on Extreme Gradient Boosting (XGB) is designed for zero-inflated incidence data, combining an XGBClassifier to predict whether incidence occurs with an XGBRegressor to estimate its size, conditioned on the forecast weather dynamics.
Our proposed multivariate climate-driven disease incidence model incorporates spatial factors (elevation, coordinates), temporal factors (year, month), and key weather parameters (precipitation, sunlight, wind, relative humidity, temperature) to predict the likelihood of multiple diseases occurring in each area, serving as a foundation for future disease incidence prediction in the region. Across 72 evaluated experiments spanning four categories and six targets, the Transformer model achieved the highest number of statistically significant wins (n=18, 25.0%) in climate-variable forecasting, compared with Long Short-Term Memory (LSTM) (n=9, 12.5%) and the Temporal Convolutional Network (TCN) (n=5, 6.9%), based on pairwise Diebold-Mariano tests. The hurdle model combining XGBClassifier and XGBRegressor outperformed the baseline in both malaria and dysentery forecasting. Error stratification revealed that the hurdle model provided the greatest benefit during incidence periods, with a substantially lower Mean Absolute Error (MAE) than the baseline in both incidence and non-incidence periods. Our modular pipeline first forecasts climate variables and then predicts disease incidence, enhancing interpretability and generalization in data-sparse settings. Overall, this approach provides a scalable, climate-aware forecasting tool for public health planning, particularly in regions where these diseases are endemic, where climate change may affect their prevalence, or where data are scarce.
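The two-part structure described above is straightforward to sketch. The following pure-Python illustration stubs out both parts with trivial models (the paper uses an XGBClassifier and XGBRegressor; the stub models, threshold, and toy case counts here are purely illustrative) to show how a hurdle forecast combines an incidence probability with a conditional size estimate:

```python
# Illustrative two-part "hurdle" forecast for zero-inflated counts.  The
# paper pairs an XGBClassifier (does incidence occur?) with an
# XGBRegressor (how large is it?); here both parts are stubbed with
# trivial models so the combining logic stays visible.

def fit_hurdle(y_train):
    """Return (p_incidence, mean_size) from a history of counts."""
    nonzero = [y for y in y_train if y > 0]
    p_incidence = len(nonzero) / len(y_train)                    # stub "classifier"
    mean_size = sum(nonzero) / len(nonzero) if nonzero else 0.0  # stub "regressor"
    return p_incidence, mean_size

def predict_hurdle(p_incidence, mean_size, threshold=0.5):
    """Expected count E[y] = P(y>0) * E[y | y>0], plus a hard call."""
    expected = p_incidence * mean_size
    incident = p_incidence >= threshold
    return expected, incident

# Toy monthly case counts with structural zeros in the dry season:
history = [0, 0, 0, 12, 30, 18, 0, 0, 0, 10, 26, 24]
p, mu = fit_hurdle(history)
expected, incident = predict_hurdle(p, mu)   # expected = 0.5 * 20 = 10.0
```

In the real pipeline both stubs would be replaced by gradient-boosted models conditioned on the forecast weather features; the multiplication of the two outputs is what handles the zero inflation.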
Wang, Y.; WANG, D.; Lau, Y. C.; Du, Z.; Cowling, B. J.; Zhao, Y.; Ali, S. T.
Mainland China experienced multiple waves of the COVID-19 pandemic during 2020-2022, driven by emerging variants and changes in public health and social measures (PHSMs). We developed a hypergraph-based Susceptible-Vaccinated-Exposed-Infectious-Recovered-Susceptible (SVEIRS) model to reconstruct epidemic dynamics across 31 provinces, capturing the transmission heterogeneity associated with clustered contacts. We assessed key transmission characteristics at national and provincial levels during four outbreak periods: initial, localized pre-Delta, Delta, and widespread Omicron, which together accounted for 96.7% of all infections. We found significant diversity in transmission contributions across cluster sizes, with a small fraction of larger clusters responsible for a disproportionate share of infections. Counterfactual analyses showed that reducing cluster-size heterogeneity, while holding overall exposure constant, could have lowered national infections by 11.70% to 30.79%, with the largest effects during the Omicron period. Ascertainment rates increased over time but remained spatially heterogeneous, ranging from 14.40% to 71.93%. Population susceptibility declined following mass vaccination (to 42.49% nationally in August 2021) and rebounded (to 89.89% in November 2022) due to waning immunity, with variations across provinces. Effective reproduction numbers displayed marked temporal and spatial variability, with higher estimates during Omicron. Overall, these results highlight the critical role of group contact heterogeneity in shaping epidemic dynamics.
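One way to see why cluster-size heterogeneity matters: within a fully mixed cluster of size s there are s(s-1)/2 contact pairs, so, holding the total number of memberships fixed, more unequal cluster sizes mean more within-cluster pairs. This toy calculation (not the paper's hypergraph SVEIRS model; the sizes are made up) illustrates the effect the counterfactual analyses exploit:

```python
# Within a fully mixed cluster of size s there are s*(s-1)/2 contact
# pairs, so pair counts grow quadratically in cluster size.  With the
# total number of memberships held fixed, equalizing cluster sizes
# therefore lowers the total within-cluster pair count.

def total_pairs(cluster_sizes):
    return sum(s * (s - 1) // 2 for s in cluster_sizes)

heterogeneous = [10, 2, 2, 2]   # 16 memberships, one dominant cluster
homogeneous = [4, 4, 4, 4]      # same 16 memberships, equal sizes

pairs_het = total_pairs(heterogeneous)   # 45 + 1 + 1 + 1 = 48
pairs_hom = total_pairs(homogeneous)     # 4 * 6 = 24
```

Halving the pair count at fixed total exposure is the flavour of reduction the paper quantifies when it equalizes cluster sizes in its counterfactuals.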
Wan, H.; Zhong, X.; Zhang, X.
Based on the 2023 Global Burden of Disease (GBD) database, this study analyzed the global burden of preterm birth from 1990 to 2023 and projected its trend through 2050, while exploring disparities in disease burden across regions with different Socio-demographic Index (SDI) levels, income groups, and countries. A retrospective trend analysis collected data on preterm birth incidence, prevalence, deaths, and disability-adjusted life years (DALYs) in 204 countries and territories from 1990 to 2023 from the GBD 2023 database. An ARIMA model (p=2, d=1, q=1) and a grey prediction model (GM(1,1)) were combined to predict the preterm birth burden from 2023 to 2050. In 2023, preterm birth was the leading cause of the global neonatal disease burden, with all four core indicators significantly higher than those of other neonatal diseases. From 1990 to 2023, global incidence, deaths, and DALYs of preterm birth decreased to 0.91, 0.44, and 0.52 times the 1990 levels, respectively, while prevalence increased to 1.54 times the baseline. Projections indicated that by 2050, incidence, deaths, and DALYs of preterm birth would fall to 0.79, 0.08, and 0.32 times the 2023 levels, while prevalence would rise to 1.23 times the 2023 level. Low SDI regions, lower-middle-income countries, and India and Nigeria in particular bore the heaviest disease burden. Over the past three decades, the acute global health burden of preterm birth, such as death, has decreased notably, but the continuing rise in prevalence and severe regional and age disparities remain prominent public health challenges. The 0-6 days and 6-11 months age groups are the key time windows for preterm birth intervention. Targeted prevention and control measures for low SDI regions and lower-middle-income countries are urgently needed to reduce the global burden of preterm birth.
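For readers unfamiliar with grey prediction, a minimal GM(1,1) forecaster can be written in a few lines. This sketch implements only the grey-model half of the study's ARIMA+GM(1,1) combination, following the standard textbook formulation; the toy series is illustrative:

```python
import math

def gm11_forecast(x0, steps=1):
    """Minimal GM(1,1) grey forecast for a short positive series x0.

    Fits dx1/dt + a*x1 = b on the accumulated series x1 = cumsum(x0)
    and extrapolates `steps` points ahead.
    """
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]              # accumulation (AGO)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    m = len(z)
    # Least-squares fit of x0_k = -a*z_k + b (2x2 normal equations).
    sz, sy = sum(z), sum(y)
    szz = sum(v * v for v in z)
    szy = sum(v * w for v, w in zip(z, y))
    det = m * szz - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):  # k is the 1-based index into the accumulated series
        return (x0[0] - b / a) * math.exp(-a * (k - 1)) + b / a

    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(1, steps + 1)]

# Toy series growing ~10% per step; the one-step forecast should land
# near the true continuation 2 * 1.1**4 = 2.9282.
forecast = gm11_forecast([2.0, 2.2, 2.42, 2.662])[0]
```

GM(1,1) is designed for exactly this situation: very short, roughly exponential series. In practice it is combined with ARIMA (as here) or other models to balance its bias on non-exponential trends.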
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a Fixed-Effects panel data methodology, the analysis controls for time-invariant national heterogeneity, ensuring reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting the efficacy of targeted service implementation as the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), which is interpreted as a vital surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting this metric primarily shows ongoing indirect cost burdens on the established patient cohort, or, alternatively, presents a diagnostic access barrier that results in lower case finding. 
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
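The fixed-effects (within) estimator underlying this analysis removes time-invariant national heterogeneity by demeaning each country's series before pooled regression. A minimal sketch with a synthetic, noise-free panel (the countries, slope, and intercepts are made up) shows that the country-specific intercepts drop out exactly:

```python
# Within (fixed-effects) estimator for y_it = alpha_i + beta * x_it:
# demeaning each country's series removes the time-invariant alpha_i,
# and pooled OLS on the demeaned data recovers beta.

def within_beta(panel):
    """panel: {country: [(x, y), ...]} -> within-estimator slope."""
    xs, ys = [], []
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        xs += [x - mx for x, _ in obs]
        ys += [y - my for _, y in obs]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic noise-free panel: very different country intercepts
# (alpha_i), one common slope.  E.g. coverage up -> incidence down.
beta_true = -1.5
panel = {
    country: [(t, alpha + beta_true * t) for t in range(5)]
    for country, alpha in {"A": 10.0, "B": 40.0, "C": 25.0}.items()
}
beta_hat = within_beta(panel)   # recovers -1.5 despite the intercepts
```

A pooled OLS that ignored the country intercepts would conflate between-country level differences with the within-country policy effect; the demeaning step is what makes the panel estimates "within-country" associations.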
PASAYAN, M. K.; Jiamsakul, A.; Yunihastuti, E.; Azwa, I.; Choi, J. Y.; Kumarasamy, N.; Avihingsanon, A.; Chaiwarith, R.; Chan, Y.-J.; Khol, V.; Kiertiburanakul, S.; Lee, M. P.; Somia, K. A.; Pujari, S.; Do, C. D.; Pham, T. N.; Zhang, F.; Khusuwan, S.; Ng, O. T.; Tanuma, J.; Gani, Y.; Borse, R.; Ross, J.; Ditangco, R.
Introduction: Viral load (VL) testing is the recommended approach for monitoring antiretroviral therapy (ART) effectiveness, while guidelines recommend targeted CD4 testing after ART initiation. This study examined trends in VL and CD4 testing frequencies, as well as their relationship with AIDS diagnosis and mortality, among people with HIV in the Asia-Pacific region. Methods: We included adults enrolled in the Treat Asia HIV Observational Database (TAHOD) between 2003-2018 who had been on ART for ≥1 year. VL and CD4 testing rates were analysed using Poisson regression models. Associations between testing frequency and AIDS diagnosis or mortality were evaluated using Fine and Gray competing risk regression. Results: Among 8,446 patients, VL testing rates remained steady at 1 per person-year (PY) between 2003-2018. Increased VL testing was associated with more frequent CD4 testing (>2 tests in the previous year; IRR=1.57, 95%CI 1.53-1.60), later follow-up years (2008-2012: IRR=1.15, 95%CI 1.12-1.18; 2013-2015: IRR=1.07, 95%CI 1.04-1.10), older age (31-40 years: IRR=1.06, 95%CI 1.03-1.08; 41-50 years: IRR=1.08, 95%CI 1.05-1.11; >50 years: IRR=1.07, 95%CI 1.03-1.11), higher current VL (401-1000 copies/mL: IRR=1.16, 95%CI 1.09-1.24; >1000 copies/mL: IRR=1.07, 95%CI 1.04-1.11), initial ART regimen (NRTI+PI: IRR=1.07, 95%CI 1.04-1.10; other combinations: IRR=1.11, 95%CI 1.05-1.17), and higher country income levels (upper-middle: IRR=2.17, 95%CI 2.11-2.23; high: IRR=3.14, 95%CI 3.03-3.26). CD4 testing rates decreased from 2.04 to 1.06 per PY over the same period. Lower CD4 testing frequency was associated with HIV exposure mode (MSM: IRR=0.94, 95%CI 0.92-0.96; IDU: IRR=0.93, 95%CI 0.90-0.97; other/unknown: IRR=0.90, 95%CI 0.87-0.93), higher current CD4 count (201-350 cells/µL: IRR=0.95, 95%CI 0.93-0.97; 351-500 cells/µL: IRR=0.89, 95%CI 0.87-0.91; >500 cells/µL: IRR=0.85, 95%CI 0.83-0.87), and receiving an NRTI+PI first-line combination (IRR=0.96, 95%CI 0.94-0.98).
VL and CD4 testing frequencies were not significantly associated with AIDS diagnosis. However, having >2 CD4 tests in the previous year was associated with higher mortality risk. Conclusion: The rates of CD4 and VL testing in the region between 2003-2018 were significantly affected by demographic, clinical, and socio-economic factors. Recognizing these factors is critical to optimizing differentiated monitoring strategies and improving outcomes for people with HIV (PWH) in the region.
Augusto, D. A.; Abdalla, L.; Krempser, E.; de Oliveira Passos, P. H.; Garkauskas Ramos, D.; Pecego Martins Romano, A.; Chame, M.
Sylvatic Yellow Fever (YF) is an infectious mosquito-borne disease of significant epidemiological relevance due to its widespread distribution and high lethality for human and non-human primates, particularly in tropical regions such as Brazil. Identifying regions and periods of high environmental suitability for the occurrence of YF is essential for preventing or mitigating its burden, as it enables the efficient allocation of surveillance, prevention, and control measures. Environmental modeling of YF occurrence has proven an effective approach toward this goal; however, its effectiveness strongly depends on the capabilities of the modeling framework as well as the spatial and temporal precision of all associated data. We propose fine-scale geospatial modeling of YF environmental suitability based on a generative machine-learning ensemble method built on a large set of high-resolution environmental covariates. First, we take the spatiotemporal statistical description of the environment of each of the 545 YF cases from 2019-2024 at up to 30 m/monthly resolution within three buffer radii: 100 m, 500 m, and 1000 m. Then, we perform feature selection and train hundreds of One-Class Support Vector Machine submodels to form a robust ensemble model, whose predictions are projected onto a 1×1 km resolution grid of Brazil under several metrics, exceeding seven million ensemble evaluations. The predictions ranked Southern Brazil as the region with the highest mean suitability for YF (0.64), followed by the Southeast (0.46), the Central-West (0.44), the North (0.39), and finally the Northeast (0.28). The model exhibited high uncertainty for the North region, indicating that data collection efforts are much needed there. As for the environmental covariates, a feature analysis indicated that land use and cover has the largest influence on the model output.
Waldetensai, A.; Tasew, G.; Yewhalaw, D.; Takie, H.; Gidey, B.; Kinde, S.; Gemechu, F.; Yirga, S.; Kinfe, E.; Hailemariam, A.; Tadesse, H.; Solomon, H.; Assefa, G.; Dilu, D.; Bashaye, S.; Wuletaw, Y.; Abdulatif, B.; Kebede, T.; Tadiwos, S.; Gebrewold, G.; Hailu, S.; Tesfaye, F.; Tollera, G.; Hailu, M.; Guiyun, Y.; Eukubay, A.; Gebresillassie, A.
Background: Malaria remains a critical global health challenge, with over 68% of Ethiopia's population living in at-risk areas. While Long-Lasting Insecticidal Nets (LLINs) are a cornerstone of prevention, their effectiveness depends on consistent use. This study aimed to assess LLIN ownership and utilization patterns and to identify socio-behavioral and physical determinants of their effectiveness in endemic communities. Methods: A community-based, cross-sectional survey was conducted from October 2024 to January 2025 across 11 administrative regions in Ethiopia. Using a two-stage stratified cluster sampling technique, data were collected from 9,222 households (34,427 individuals) through face-to-face interviews and direct physical observation. Data analysis was performed using the SPSS Complex Samples module and hierarchical multivariable logistic regression. Results: The survey found a household LLIN ownership rate of 71.5%, while the proportion of households with sufficient LLINs (one net for every two people) was 58.3%. Among net-owning households, the overall utilization rate was 59.9%, with significantly higher rates in rural areas (72.7%) than in urban areas. Vulnerable groups achieved higher usage levels, specifically pregnant women (78.5%) and children under five (67.2%). Multivariable analysis indicated that age and pregnancy status were the strongest predictors of LLIN use, with ORs of 0.258 (p < 0.001) and 0.662 (p < 0.001), respectively. Major barriers included a lack of confidence in hanging nets (60.5%; p < 0.001) and the widespread misconception (64.1%) that malaria risk is restricted to the rainy season. Conclusion: Although Ethiopia has made strides in LLIN ownership and has prioritized protection for vulnerable demographics, overall utilization remains below the 80% threshold required for community-wide protection.
To bridge the gap between ownership and consistent use, national strategies should transition toward skill-based interventions and targeted communication to address practical barriers and seasonal misconceptions.
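A caution when reading adjusted odds ratios like the 0.258 and 0.662 above: an odds ratio multiplies the baseline odds, not the baseline probability. A small helper, using the survey's overall 59.9% utilization rate as a hypothetical reference probability, converts an OR into the implied group probability:

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Probability implied by scaling the baseline odds by an OR."""
    odds = p_baseline / (1.0 - p_baseline) * odds_ratio
    return odds / (1.0 + odds)

p_ref = 0.599                                # hypothetical reference group
p_or_age = apply_odds_ratio(p_ref, 0.258)    # OR 0.258 -> ~0.28
p_or_preg = apply_odds_ratio(p_ref, 0.662)   # OR 0.662 -> ~0.50
```

The reference probability here is an assumption for illustration only; the study's ORs are adjusted estimates whose reference categories are defined in the full paper.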
Dye-Robinson, A.; Josey, K. P.; Jaramillo, D.; Dally, M.; Krisher, L.; Butler-Dawson, J.; Villarreal Hernandez, K.; Cruz, A.; Pilloni, D.; Adgate, J. L.; Schaeffer, J.; Johnson, R. J.; Chonchol, M.; Newman, L. S.
Background: Chronic Kidney Disease of unknown etiology is a growing health concern in low- and middle-income countries. While occupational heat stress is recognized as a potential contributor to kidney dysfunction among agricultural workers, the causal relationship between heat stress, core body temperature (Tc), and kidney function remains unclear. Methods: We conducted an observational study over two harvest seasons in Guatemala, following 148 male sugarcane workers across six months. Heat stress was measured using the heat index (HI), and Tc was measured with ingestible telemetric temperature pills. Particulate matter (PM) exposure was measured using personal breathing-zone samplers worn during the work shift. We evaluated changes in kidney function using pre- and post-shift estimated glomerular filtration rate (eGFR). We applied G-computation to estimate causal effects and modeled hypothetical policy interventions reducing HI, Tc, and PM exposure, simulating occupational heat reduction strategies. Results: The average daily HI was 37.4 °C (SD: 2.0), with an average Tc increase of 1.16 °C (SD: 0.48) per shift. Both HI and Tc were associated with declines in eGFR across the work shift. At an HI of 34 °C, workers experienced an average eGFR decline of about 5 mL/min/1.73 m², while at 40 °C the decline exceeded 16 mL/min/1.73 m². High HI early in the season and elevated Tc later in the season contributed to kidney decline. A simulated intervention reducing HI exposure by 5% improved eGFR change by 1.46 mL/min/1.73 m². PM exposure did not have a significant impact on eGFR decline. Conclusion: Reducing workday heat exposure may mitigate acute kidney function decline. These findings support the development of policy interventions aimed at reducing external heat exposure and internal heat strain to protect kidney health.
More research is needed to investigate the potential contribution of other environmental factors, including PM exposure.
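G-computation, as used in this study, proceeds in two steps: fit an outcome model on observed exposures, then average the model's predictions under a counterfactual exposure policy. The sketch below uses a simple linear outcome model and synthetic numbers loosely echoing the abstract (the data and the resulting effect size are illustrative, not the study's):

```python
# G-computation sketch: (1) fit an outcome model for the eGFR change
# given the heat index (HI); (2) predict for every worker under a
# counterfactual exposure (HI reduced by 5%) and average.  The linear
# model and the numbers below are illustrative, not the study's data.

def fit_slope_intercept(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

hi = [34.0, 36.0, 37.4, 39.0, 40.0]         # observed heat index (deg C)
d_egfr = [-5.0, -8.7, -11.3, -14.2, -16.1]  # eGFR change per shift

slope, intercept = fit_slope_intercept(hi, d_egfr)
observed_mean = sum(d_egfr) / len(d_egfr)
counterfactual_mean = sum(intercept + slope * (x * 0.95) for x in hi) / len(hi)
effect = counterfactual_mean - observed_mean   # > 0 means less decline
```

The study's actual implementation would use a richer outcome model with confounders; the key G-computation idea is the second step, predicting every individual's outcome under the hypothetical policy and comparing population means.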
Man, I.; Macacu, A.; Eynard, M.; Adhikari, I.; Gini, A.; Georges, D.; Baussano, I.
Background: Public health decision-modelling tools designed to inform cervical cancer prevention policies in low- and middle-income countries (LMICs) are useful but scarce. Important challenges are the often missing or inconsistently collected cervical cancer epidemiological data and the lack of a systematic approach to dealing with such data limitations. Methodology/Principal Findings: We developed a unified modelling platform and workflow to enable cervical cancer modelling in 132 LMICs based on the previously developed footprinting approach, through the following steps: 1) Using sexual behavior data from the Demographic and Health Surveys (DHS), available for a large number of LMICs (70/132), we identified clusters of countries representing distinct patterns of human papillomavirus (HPV) transmission. The 7 resulting clusters correspond to a gradient of HPV prevalence and cervical cancer risk and exhibit clear geographical separation. 2) The remaining LMICs were classified into the identified clusters based on geographical proximity, so that every LMIC was assigned to a cluster. The quality of the classification was validated against available epidemiological data. 3) We then calibrated the HPV transmission and cervical cancer progression models of the IARC/WHO METHIS platform to the 132 LMICs, first by cluster and then by country, using the available data on sexual behavior (from DHS), HPV prevalence (from a literature search), and cervical cancer incidence (from GLOBOCAN). Conclusions/Significance: A unified workflow and platform designed by IARC/WHO for public health decision modelling of cervical cancer prevention in 132 LMICs is now available. It is ready to support global and local stakeholders in coordinating, designing, and implementing impactful and efficient prevention policies, and it will help accelerate cervical cancer elimination.
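Step 2 of the workflow, assigning the remaining LMICs to clusters by geographic proximity, amounts to a nearest-centroid rule. A minimal sketch (the centroids and coordinates are hypothetical, and plain Euclidean distance on latitude/longitude stands in for whatever proximity measure the authors used):

```python
import math

def nearest_cluster(point, centroids):
    """Assign a (lat, lon) point to the geographically nearest centroid.

    Plain Euclidean distance on coordinates is a crude proxy for
    geographic proximity; it suffices to illustrate the rule.
    """
    return min(centroids, key=lambda name: math.dist(point, centroids[name]))

# Hypothetical centroids of DHS-derived clusters (lat, lon):
centroids = {
    "cluster_1": (0.0, 20.0),     # central Africa
    "cluster_2": (20.0, 78.0),    # south Asia
    "cluster_3": (-10.0, -55.0),  # south America
}

label = nearest_cluster((5.0, 25.0), centroids)   # -> "cluster_1"
```

A production version would use great-circle distances (or shared-border adjacency) and then validate assignments against epidemiological data, as the abstract describes.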
Mohsini, K.; Gore-Langton, G. R.; Rathod, S. D.; Mansfield, K. E.; Warren-Gash, C.
Aims: Indoor air pollution resulting from the combustion of unclean cooking fuels has been linked to adverse health outcomes, but evidence regarding its association with mental health in low- and middle-income countries remains limited. We investigated the association between household use of unclean cooking fuels, as a proxy for indoor air pollution, and depression symptoms among adults aged 45 years and older in India, and assessed effect modification by age, sex, caste, and rural/urban residence. Methods: We conducted a cross-sectional analysis of the first wave (2017-2018) of data from the Longitudinal Aging Study in India (LASI), a nationally representative survey of adults aged ≥45 years. Cooking fuel type was classified as clean or unclean, and depression symptoms were assessed using the 10-item Center for Epidemiologic Studies Depression (CES-D-10) scale. We used logistic regression to estimate odds ratios for depression symptoms and linear regression to compare mean CES-D-10 scores by cooking fuel type, adjusting for sociodemographic and housing characteristics. Results: We included 62,650 respondents. Median age was 57 years (IQR: 50-65), 46.7% were women, 47.6% reported using unclean cooking fuels, and 27.6% screened positive on the CES-D-10. After adjustment, use of unclean cooking fuels was associated with higher odds of screening positive on the CES-D-10 (aOR: 1.08; 95% CI: 1.02, 1.15) and higher mean CES-D-10 scores (adjusted mean difference: 0.34; 95% CI: 0.24, 0.44). The association was more pronounced among individuals living in urban areas (aOR: 1.36; 95% CI: 1.21, 1.53). Conclusion: Use of unclean cooking fuels was associated with depression symptoms among older adults in India, especially among those living in urban areas.
Qu, S.; Sillmann, J.; Barrett, B. W.; Graffy, P. M.; Poschlod, B.; Brunner, L.; Mansour, R.; Szombathely, M. v.; Hay-Chapman, F.; Horton, T. H.; Chan, J.; Rao, S. K.; Woods, K.; Kho, A. N.; Horton, D. E.
As climate change intensifies, health risks from extreme heat are rising. Accurate assessment of heat vulnerability at high spatial resolution is crucial for developing effective adaptation strategies, particularly in socioeconomically heterogeneous urban settings. However, the identification of key indicators underlying heat vulnerability remains challenging. Using Chicago, Illinois (USA) as a case study, we systematically compare different variable selection strategies in community-level heat vulnerability assessments. We take the conventional unsupervised principal component analysis (PCA)-based Heat Vulnerability Index (HVI) as a baseline, and compare it with supervised approaches that incorporate variable selection, including machine learning algorithms (Lasso regression, Random Forest, and XGBoost) as well as traditional statistical methods (simple linear regression and polynomial regression). Using the vulnerability indicator subsets identified by each variable selection method, we construct multiple HVIs and evaluate their performance against heat-related excess mortality. Our work indicates that supervised variable selection improves the performance of HVIs in capturing heat-related health risks. Among all methods, the Random Forest-based variable selection algorithm achieves the best overall results, highlighting the potential of machine learning to enhance heat vulnerability assessment tools. Our results demonstrate that poverty rate, lack of air conditioning, and proportion of residents aged 65 and above are robust determinants of heat vulnerability in Chicago.
Anshad, A. R.; Atchaya, M.; Saravanan, S.; Murugesan, A.; Balakrishnan, P.; Raju, S.; Yong, Y. K.; Larsson, M.; Shankar, E. M.
Introduction: Severe dengue infection is characterized by endothelial injury and systemic inflammatory complications. To better understand the mechanisms underlying disease severity, we investigated a broad panel of circulating inflammatory and endothelial mediators in patients with clinical dengue infection. Methods: A prospective cross-sectional case-control study was carried out involving 111 dengue patients and 42 healthy controls. Among the dengue cases, 85 were identified as primary and 26 as secondary dengue infections. Serum levels of endothelial markers (Ang-2, CXCL10, MCP-1, TRAIL), liver dysfunction and acute-phase markers (CRP, galectin-3, and serum amyloid protein), systemic inflammatory mediators (MIF, TNF-α, IL-1β), mast cell-derived proteases (chymase, tryptase), and tissue repair markers (HGF, IL-10, IL-1Ra) were quantified using ELISA and Luminex multiplex assays. Correlations among serum analytes, severity indicators, and haematological markers were also explored. Results: Several biomarkers (Ang-2, CXCL10, TRAIL, CRP, MIF, IL-1Ra, TNF-α, and chymase) showed differential expression across severity groups, indicating coordinated endothelial and inflammatory activation. Stratification of patients into primary and secondary dengue followed a similar pattern, except for IL-1β, which showed significant differential expression across the cohorts. Ang-2 showed strong positive correlations with markers of hepatic dysfunction, including ALT, AST, and bilirubin, suggesting a link between endothelial injury and liver involvement. Conclusions: Severe dengue is driven by the coordinated activation of endothelial dysfunction, acute-phase responses, mast cell mediators, and counter-regulatory pathways. These processes collectively contribute to vascular leakage and organ injury, reinforcing the value of biomarkers such as Ang-2, CXCL10, CRP, and chymase for severity assessment.
Putney, N.; Sayyad-Hilario, J.; Ukawuba, I.; Grandesso, F.; Singh, S.; Safari, P. D.; Pothin, E.; Filippini, B.; Djovouna, E.; Diar, M. S. I.; Champagne, C.; Camacho, A.
Background: Seasonal malaria chemoprevention (SMC) is a malaria intervention in which antimalarial drugs are administered monthly to children under 5 years of age during the high-transmission season. In the district of Moissala in southern Chad, SMC has been implemented since 2013, with an interruption in 2019, resumption in 2020, and expansion to five rounds of treatment in 2021. Recent World Health Organization (WHO) guidelines allow countries to adapt the timing and number of SMC rounds to local transmission patterns, creating a need to identify optimal strategies for each setting. In this study, we used mathematical modeling for three primary purposes: 1) to estimate the effectiveness of SMC in Moissala from 2018 to 2023, 2) to assess the impact of changes to SMC strategies since 2018, and 3) to determine the optimal SMC strategy in Moissala. Methods and findings: We adapted a compartmental, climate-informed malaria transmission model to represent malaria dynamics in the presence of SMC. The model incorporates temperature and rainfall data to capture how climate variability influences malaria transmission over time. It was calibrated to routine surveillance data on malaria cases in children under five years old from 2018 to 2023. Using the calibrated model, we simulated malaria cases under alternative scenarios, including the absence of SMC and variations in the number and timing of SMC rounds. These simulations were then used to estimate the overall effectiveness of SMC, assess the impact of past changes in SMC strategies, and identify the optimal strategy in Moissala. Between 2018 and 2023, SMC reduced malaria cases in children under five by 26% (95% credible interval: 21%, 31%) relative to a scenario without SMC, corresponding to an average of approximately 14,400 cases averted each year. The interruption of SMC in 2019 led to an estimated increase of 13,600 cases (95% credible interval: 11,200, 15,800), representing a 31% rise during the high-transmission season.
Expanding from four to five SMC rounds in 2021 reduced cases by 7% relative to a four-round schedule, while starting the five-round schedule earlier, in June rather than July, led to an additional 5% reduction. Overall, the most effective strategy from 2018 to 2023 was a five-round schedule beginning in mid-June. Conclusions: Seasonal malaria chemoprevention has substantially reduced malaria incidence among children under five in Moissala. The currently implemented strategy of five SMC rounds starting in June was estimated to achieve the greatest reduction in cases over the study period. Climate-informed modelling and open-source software can support timely decision-making across settings under changing climate and transmission conditions.
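The headline figures relate through simple arithmetic: relative effectiveness is cases averted divided by the counterfactual (no-intervention) burden. A back-of-envelope check using the abstract's 26% reduction and roughly 14,400 cases averted per year recovers the implied no-SMC burden:

```python
def cases_averted(counterfactual, effectiveness):
    """Averted cases = counterfactual burden * relative reduction."""
    return counterfactual * effectiveness

def implied_counterfactual(averted, effectiveness):
    """Invert the relation to recover the no-intervention burden."""
    return averted / effectiveness

# The abstract reports a 26% reduction and ~14,400 cases averted per
# year, implying roughly this annual burden had SMC not been in place:
burden = implied_counterfactual(14_400, 0.26)   # about 55,400 cases/year
```

This is only a consistency check on the reported point estimates; the study's credible intervals come from the calibrated transmission model, not from this arithmetic.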
Halder, C. E.; Hasan, M. A.; Soma, E.; Charles Okello, J.; Rahman, M. M.; Das, P. P.; Prue, U. M.; Barasa, D. W.; Md, A.; Hosen, M. S.; Shagar, S. H.; CHONG, E. Y. C.; Paul, D.; Mowla, S. M. N.; Hoque, M.; Bhuiyan, A. T. M.; Hussain, M. F.
Background: Dengue emerged as a new public health threat in the Rohingya refugee camps in Cox's Bazar, Bangladesh, in 2021 and expanded into large-scale upsurges in subsequent years. Evidence on dengue epidemiology and clinical presentation in protracted refugee settings remains limited, despite the need for stronger outbreak preparedness and case management in these contexts. Objectives: To describe the epidemiological, clinical, and diagnostic characteristics of the dengue upsurge among Rohingya refugees and surrounding host communities in Cox's Bazar, Bangladesh, and to identify predictors of inpatient admission and diagnostic positivity patterns. Methods: This retrospective observational study used anonymized surveillance data from the International Organization for Migration (IOM) dengue patient database. Rapid diagnostic test (RDT)-confirmed dengue cases identified across 36 IOM-supported health facilities in Ukhiya and Teknaf between 1 October 2021 and 31 December 2024 were included. Demographic, epidemiological, clinical, and laboratory variables were summarized using descriptive statistics. Weekly incidence was aggregated by epidemiological week and calendar year, and epidemic growth and decay phases were modelled using phase-specific Poisson regression. Multivariable logistic regression was used to identify predictors of inpatient admission and to examine associations between delay in presentation and RDT positivity patterns, adjusting for age and sex. Results: A total of 35,581 RDT-confirmed dengue cases were reported, of which 90.2% occurred among Rohingya refugees. The median age was 17 years (IQR 7-30), and 46.0% of cases were among children aged 0-14 years. The annual caseload increased from 1,011 in 2021 to 11,752 in 2022, 10,669 in 2023, and 12,149 in 2024, with seasonal peaks during the monsoon period and progressively later peaks and longer epidemic tails over time.
Poisson models showed decreasing growth rates across years (r=0.449 in 2021 to r=0.091 in 2024) with increasing doubling times, while decay rates remained broadly comparable (halving time ~4.4-6.0 weeks). Overall, 8.0% of cases required inpatient admission, 1.3% were referred, and four deaths were reported (case fatality <0.1%). In multivariable analysis, inpatient admission was associated with older age (≥60 vs 0-14: aOR 2.31), delayed presentation (aOR 1.06 per day), refugee status (aOR 1.39), presence of any World Health Organization (WHO) warning sign (aOR 26.60), low systolic BP (aOR 2.84), and chronic co-morbidity (aOR 6.07). In addition, males had lower odds of admission than females (aOR 0.88). NS1 antigen alone was positive in 62.1% of cases, IgM alone in 33.6%, and dual positivity in 4.3%. Longer delay from symptom onset to presentation was strongly associated with IgM-only positivity compared with NS1-only positivity (adjusted models controlling for age and sex). Conclusion: These findings highlight the need for a policy shift from episodic outbreak response toward sustained dengue preparedness in Cox's Bazar and similar humanitarian settings, including strengthened surveillance, community-based early referral, targeted monitoring of high-risk groups, environmental vector control, and phase-appropriate use of NS1 and IgM/IgG diagnostics to reduce missed diagnoses and prevent progression to severe disease.
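The phase-specific growth rates quoted above map onto doubling times through the standard exponential-growth identity t_d = ln(2)/r (and halving times follow from the same identity applied to the decay rate). A minimal sketch in Python, using the per-week growth rates reported in the abstract:

```python
import math

def doubling_time(r: float) -> float:
    """Doubling time, in the same time units as r, for exponential
    growth of the form cases(t) = cases(0) * exp(r * t)."""
    if r <= 0:
        raise ValueError("growth rate must be positive")
    return math.log(2) / r

# Per-week epidemic growth rates reported in the abstract
for year, r in [(2021, 0.449), (2024, 0.091)]:
    print(f"{year}: r = {r} -> doubling time ~ {doubling_time(r):.1f} weeks")
```

With r falling from 0.449 to 0.091, the implied doubling time lengthens from roughly 1.5 to 7.6 weeks, consistent with the slowing epidemic growth the abstract describes.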
Andrei, F.; Tizzoni, M.; Veltri, G. A.
Background: Dengue is rapidly emerging in parts of Europe. How households value vector control attributes, and whether inferences depend on decision models or message framing, is unclear. Methods: We conducted a split-ballot online experiment among adults in Italy and France, as well as a hotspot subsample from Marche, Italy. National samples included 1,505 respondents in Italy and 1,501 in France; 183 respondents were recruited in Marche. Participants were randomised to a discrete choice experiment (random utility maximisation) or a regret-based choice experiment (random regret minimisation) and to one of three pre-task messages (control, loss aversion, community values). Each respondent completed 12 choice tasks comparing two dengue control programmes and an opt-out. We estimated mixed logit and mixed random-regret models with random parameters and treatment effects. Results: Across frameworks, nearby cases and high mosquito prevalence were the dominant drivers of programme uptake, whereas cost and operational burden were secondary. In pooled analyses, loss-aversion messaging increased the weight on high mosquito prevalence in both models (from 0.483 to 0.547 in the utility model; from 0.478 to 0.557 in the regret model). Cost effects were small nationally but larger in the hotspot subsample. Conclusions: Risk salience dominates preferences for dengue vector control in these European settings. Random utility and random regret models yield consistent rankings of attributes but differ in behavioural interpretation and some secondary effects; messaging effects were modest and context dependent.
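Under random utility maximisation, the probability of choosing a programme follows the multinomial-logit form P(i) = exp(V_i) / Σ_j exp(V_j). A minimal sketch with hypothetical utilities for two programmes and the opt-out (the numbers are illustrative, not estimates from the study):

```python
import math

def choice_probabilities(utilities):
    """Multinomial-logit choice probabilities under random utility
    maximisation: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)  # subtract the max to stabilise the exponentials
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities: programme A, programme B, opt-out
probs = choice_probabilities([0.8, 0.5, 0.0])
print([round(p, 3) for p in probs])
```

The regret-based variant replaces the utility V_i with (the negative of) an anticipated-regret term that compares each alternative attribute-by-attribute against the others, which is why the two frameworks can rank attributes identically while differing in behavioural interpretation.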
Alshahrani, A. M.; Ashour, A. M.
Background: Vaccine-preventable diseases (VPDs) continue to impose a significant health and economic burden globally, despite advances in immunization programs. In the context of Saudi Arabia, the literature consistently shows that high vaccination coverage has substantially reduced disease incidence. Nevertheless, the broader economic impact of VPDs and the financial benefits of immunization remain important for policy evaluation within Saudi Arabia. Methods: This study employed a model-based economic evaluation from a societal perspective to estimate the economic burden of measles, influenza, and pneumococcal diseases. We used the Cost of Illness (COI) approach to quantify direct medical costs and indirect productivity losses, and the Value of Statistical Life (VSL) approach to estimate the monetary value of mortality reduction. A comparative framework analyzed current vaccination coverage against a counterfactual no-vaccination scenario to calculate the return on investment (ROI). Results: The estimated annual economic burden of the three selected VPDs in the absence of vaccination was USD 385 (95% CI: 315-460) million. Immunization programs generated substantial economic benefits, with total benefits estimated at USD 1,085 (95% CI: 815-1,360) million annually. The calculated ROI was 9.0 (95% CI: 6.8-11.3), indicating that each dollar invested in vaccination yielded a ninefold economic return. Sensitivity analyses confirmed the robustness of these findings. Conclusion: Immunization programs in Saudi Arabia provide significant economic and public health benefits; sustained investment in vaccination is therefore essential to reduce disease burden, improve population health, and support long-term economic productivity.
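ROI conventions vary (net benefit per dollar invested versus gross benefit per dollar), and the abstract does not report the underlying program cost. A minimal sketch under the net-benefit convention, with a hypothetical cost figure chosen only to reproduce the reported ratio:

```python
def roi(total_benefits: float, program_cost: float) -> float:
    """Return on investment as net benefit per dollar invested."""
    return (total_benefits - program_cost) / program_cost

# Hypothetical figures in USD millions: the abstract reports annual
# benefits of 1,085M and an ROI of 9.0, but not the program cost.
cost = 108.5  # implied cost IF ROI is defined as net benefit / cost
print(f"ROI = {roi(1085.0, cost):.1f}")
```

Under the gross convention (benefits / cost), the same reported figures would instead imply a cost of about USD 120.6 million, which is why the definition matters when comparing ROI estimates across studies.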
Bastard, J.; Migne, C.; Helle, T.; Agneray, E.; Bigeard, C.; Boudjadi, Y.; Chevrier, M.; Dumarest, M.; Gondard, M.; Martin-Latil, S.; Mathews-Martin, L.; Petit, T.; Charpentier, T.; Pouillevet, H.; Durand, B.; Metras, R.; Gonzalez, G.
Zoos may serve as sentinel sites for zoonotic vector-borne diseases. West Nile virus (WNV) and Usutu virus (USUV) are closely related orthoflaviviruses transmitted between Culex mosquitoes and a bird reservoir. Both viruses can also infect mammals, including humans, where they may cause symptoms and, more rarely, hospitalization and death. However, serological cross-reactivity between WNV and USUV complicates their differential diagnosis. Here, we aimed to reconstruct the dynamics of emergence of WNV in a zoo located in a newly affected area in Europe, using ELISA and Virus Neutralization Test (VNT) serological analysis of 1,707 animal sera collected between 2015 and 2024. Combining these data in a model accounting for cross-reactivity with USUV, we estimated yearly forces of infection (FOI) for both viruses, and found that WNV likely circulated in the area one year before the first cases were reported to the passive surveillance system. Our results also showed that, in the zoo, mammals and reptiles had a lower risk of infection than birds (relative risk of 0.14 [0.05; 0.28]), and that the exposure of birds to water (aquatic lifestyle or proximity to stagnant water) affected the risk. Finally, we estimated diagnostic parameters, including the sensitivity of the VNT (80.4% [76.5%; 84.3%]), the expected VNT titer value, and the level of serological cross-reactivity between viruses during the VNT. In conclusion, our modelling framework allowed us to disentangle the co-circulation of two closely related viruses, a crucial point in ensuring reliable sentinel surveillance of these vector-borne zoonotic pathogens.
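Force-of-infection estimation from serology rests on the serocatalytic idea that cumulative exposure accrues with time at risk. The study's actual model is considerably richer (yearly FOIs, two cross-reacting viruses, imperfect test sensitivity), but the basic constant-FOI relationship can be sketched as:

```python
import math

def seroprevalence(foi: float, years_at_risk: float) -> float:
    """Expected seroprevalence after a given time at risk under a
    constant force of infection (simple serocatalytic model):
    P(seropositive) = 1 - exp(-foi * t)."""
    return 1.0 - math.exp(-foi * years_at_risk)

# Illustrative only: with a FOI of 0.05 per year, an animal exposed
# for 10 years has an expected seroprevalence of about 39%.
print(round(seroprevalence(0.05, 10), 3))
```

Fitting such a curve to age- or cohort-stratified seroprevalence data yields the FOI; allowing the FOI to vary by year, as the authors do, is what lets the model date the start of circulation before the first passively reported cases.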
Neveux, M. R.; Hylin, R.; Ruiz Gonzalez, V.; Lopez Parra, A. P.; Onoja, A.; Sirivichayakul, S.; Sulaimon, A. A.; Rwibasira, G.; Ochwoto, M.
Introduction: Diagnostics have become the fundamental backbone of HIV prevention, treatment, and long-term retention in care, and are critical to achieving the 95-95-95 UNAIDS targets. To effectively reach underserved and remote populations, diagnostic technologies must be cost-effective, robust, user-friendly, and suitable for settings with limited infrastructure. Among available testing modalities, rapid diagnostic tests (RDTs) play a central role in expanding HIV testing coverage. Earlier generations of RDTs were limited by their inability to detect acute HIV infection, as they could not detect p24 antigen (Ag), an early marker of HIV infection. Fourth-generation RDTs, which detect both chronic and acute HIV infection through combined antibody and p24 Ag detection, were designed to shorten the traditional diagnostic window of two-to-three months to approximately two-to-three weeks. However, integrating these assays into existing testing algorithms requires clear evidence that they meet high standards of quality and performance. This systematic review aims to assess the performance of WHO-prequalified fourth-generation Ag/Ab RDTs. Methods: We performed a systematic search across six databases to identify studies evaluating Ag/Ab RDTs against laboratory reference standards in individuals aged 12 years and older, spanning 1 January 2010 to 31 December 2025. Outcomes were limited to measures of diagnostic accuracy. A meta-analysis focusing exclusively on WHO-prequalified fourth-generation RDTs was performed using a bivariate random-effects model. Results: 1,932 records were screened, of which 31 diagnostic accuracy studies from 19 countries were included. 15 studies used US-only approved products, 12 used WHO-prequalified products, and four used commercially discontinued products. The pooled sensitivity of WHO-prequalified Ag/Ab RDTs for acute HIV infection (AHI) was 94% (95% CI: 86%-99%).
An RNA threshold of ≥1,000,000 copies/mL was used as a proxy for high viraemia and as a cut-off for the following analyses. The cut-off-based analysis is considered more suited to decision-making, as it focuses on cases most likely to be associated with higher viraemia and greater potential for detection during the p24 Ag window. When using enzyme immunoassay (EIA) as the reference standard, the pooled p24 Ag sensitivity was 76% (95% CI: 62%-88%); when using nucleic acid amplification test (NAAT) as the reference standard, it was 75% (95% CI: 41%-97%). In the general population, the pooled sensitivity for p24 antigen detection was 77% (95% CI: 60%-92%). Among risk populations, only three studies had available raw data, and the pooled sensitivity was 62% (95% CI: 10%-97%). In plasma and serum specimens, pooled p24 Ag sensitivity was 74% (95% CI: 57%-88%). Discussion: Collectively, these findings indicate that WHO-prequalified fourth-generation Ag/Ab RDTs can function as a scalable frontline screening tool, particularly in low- and middle-income countries, while offering incremental gains in detection through p24 Ag. Their effective deployment, however, depends on maintaining standard algorithm safeguards, including repeat testing and targeted laboratory referral when acute infection is suspected. Conclusions: Results from this meta-analysis support the use of WHO-prequalified fourth-generation Ag/Ab RDTs for general population screening. From a programmatic perspective, their added value lies in combining rapid, decentralized access to testing with incremental yet meaningful improvements in the detection of acute infection.
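The pooled sensitivities above come from a bivariate random-effects model. As a much simpler illustration of the underlying idea, per-study sensitivities can be pooled by inverse-variance weighting on the logit scale. A sketch with hypothetical true-positive/false-negative counts (not data from the review):

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def inv_logit(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def pooled_sensitivity(tp_fn_pairs):
    """Inverse-variance pooled sensitivity on the logit scale.
    A simplified univariate fixed-effect sketch, NOT the bivariate
    random-effects model used in the review."""
    num = den = 0.0
    for tp, fn in tp_fn_pairs:
        tp, fn = tp + 0.5, fn + 0.5      # continuity correction
        sens = tp / (tp + fn)
        var = 1.0 / tp + 1.0 / fn        # approx. variance of logit(sens)
        num += logit(sens) / var
        den += 1.0 / var
    return inv_logit(num / den)

# Hypothetical per-study (true positives, false negatives)
print(round(pooled_sensitivity([(45, 5), (90, 6), (28, 2)]), 3))
```

The bivariate model additionally pools specificity and accounts for the correlation between sensitivity and specificity across studies, which generally widens the confidence intervals relative to this naive pooling.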
Abukunna, F.; Matamala Luengo, D.; Martin Manrique, A.; Duruanyanwu, J.; Sherwood, M.; Patel, P.; Crabtree, M.; Birdsey, G. M.; Maringer, K.; Campagnolo, P.
Dengue virus (DENV) infection is a major global health threat, affecting more than half of the world's population. Severe dengue is a life-threatening condition characterised by systemic bleeding, vascular leakage, and interstitial fluid accumulation that can progress to hypovolaemic shock. Circulating DENV non-structural protein 1 (NS1) has long been implicated in driving vascular hyperpermeability through its disruptive effects on endothelial cell junctions and the glycocalyx. The lymphatic system, which runs alongside the vascular network, plays a critical role in resorbing and recirculating interstitial fluid and immune cells extravasated from blood vessels. Despite its importance in maintaining tissue fluid homeostasis, the impact of dengue disease on lymphatic vessels has not previously been explored. Here, we present the first evidence that DENV-2 NS1 induces marked hyperpermeability in lymphatic endothelial cells, as measured by transendothelial electrical resistance, and impairs lymphangiogenesis in vitro. These effects were not attributable to changes in cell viability, morphology, or metabolic activity, as assessed by live/dead and metabolic assays and image analysis. Instead, we observed a defect in lymphatic endothelial cell migration, measured by scratch assay, which may underlie the reduced lymphangiogenic potential. Bulk RNA-seq, immunocytochemistry, and advanced image analysis further demonstrated pronounced reorganisation of cell-cell junctions, the cytoskeleton, and focal adhesions. Notably, junctional proteins including VE-cadherin, ZO-1, and Claudin-5 were not downregulated but instead displayed disorganised distribution along the cell junctions or aberrant cytoplasmic localisation. These structural disruptions became even more pronounced under flow conditions produced using a microfluidic system.
Together, these findings demonstrate for the first time that DENV-2 NS1 directly disrupts lymphatic endothelial cell function, leading to junctional disorganisation and hyperpermeability. Such impairment of lymphatic drainage may contribute to the pathophysiology of severe dengue. Author Summary: Dengue is a rapidly expanding mosquito-borne disease that now affects many tropical and subtropical regions worldwide. Severe cases can lead to extensive fluid leakage from blood vessels, which causes tissue swelling and, in the most dangerous situations, shock. Although much research has focused on how dengue damages the blood vascular system, almost nothing is known about its impact on the lymphatic system, which is responsible for removing fluid from tissues and returning it to the bloodstream. Because both systems work together to maintain fluid balance, understanding how dengue affects lymphatic vessels is important for explaining why fluid accumulation becomes so severe in critical disease. In our study, we examined whether the viral protein NS1, which circulates during infection, directly affects the cells that line lymphatic vessels. We found that NS1 increases the permeability of these cells and reduces their ability to form new vessel structures. These effects were not caused by cell death but by disruptions in how the cells organise their junctions, internal scaffolding, and interactions with neighbouring cells. By showing that NS1 can directly impair lymphatic vessel function, our work identifies a previously overlooked mechanism that may contribute to fluid build-up in severe dengue and suggests new avenues for future therapeutic research.
Koyra, A. B.; Mohammed, F.; Eshete, T.
Background: Family-based HIV index case testing identifies family members with unknown HIV status and links them to care. Data are limited in southern Ethiopia. Methods: A facility-based cross-sectional study was conducted among 377 adults on antiretroviral therapy (ART) in Wolaita Zone, Southern Ethiopia, from November 2022 to May 2023. Participants were selected using systematic random sampling. Data were collected via an interviewer-administered semi-structured questionnaire. Multivariable logistic regression identified factors associated with index case family testing. Adjusted odds ratios (AOR) with 95% confidence intervals (CI) were calculated, and statistical significance was declared at p < 0.05. Results: The proportion of index case family testing for HIV was 84.9% (95% CI: 81.2-88.6). In multivariable analysis, urban residence (AOR = 2.8; 95% CI: 1.16-6.75), duration on ART greater than 12 months (AOR = 13.0; 95% CI: 4.6-36.9), disclosure of HIV status to family members (AOR = 5.6; 95% CI: 1.9-16.5), discussion of HIV status with family members (AOR = 6.6; 95% CI: 1.9-23.2), and being counselled by health professionals to bring family members for testing (AOR = 6.3; 95% CI: 2.1-19.0) were significantly associated with index case family testing. Conclusion: The prevalence of family-based HIV index case testing in Wolaita Zone was 84.9%, below the national 95% target. Health professionals should strengthen counselling on ART adherence, status disclosure, family discussion, and active referral to improve testing uptake among family members of people living with HIV.
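Adjusted odds ratios like those reported in these abstracts are obtained by exponentiating logistic-regression coefficients, and the 95% CI follows from the coefficient's standard error. A minimal sketch with a hypothetical coefficient and standard error (not values from the study):

```python
import math

def aor_with_ci(beta: float, se: float):
    """Adjusted odds ratio and 95% Wald CI from a logistic-regression
    coefficient and its standard error."""
    lo = beta - 1.96 * se
    hi = beta + 1.96 * se
    return math.exp(beta), (math.exp(lo), math.exp(hi))

# Hypothetical coefficient/SE for a binary covariate
aor, (lo, hi) = aor_with_ci(beta=1.03, se=0.45)
print(f"AOR = {aor:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```

Because the interval is symmetric on the log-odds scale, it is asymmetric around the AOR itself, which is why reported CIs such as 1.16-6.75 are skewed to the right of the point estimate.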